METHOD FOR REPRESENTATION BY ULTRASOUND IMAGING AND ULTRASOUND IMAGING SYSTEM
ABSTRACT:
Systems and methods for enhanced image representation of objects within an image are disclosed. Systems and methods that provide a plurality of different image signals in generating an image structure are shown. A first image signal may be configured to provide relatively high quality images with respect to regions under the surface of living tissue, for example, whereas a second image signal may be configured to provide relatively high quality images with respect to interventional instruments inserted into living tissue at an acute angle. Image substructures generated using each of these different image signals are merged to form a final image structure that provides a relatively high quality image of various objects within the volume being represented by the image. Publication number: BR112012025710B1 Application number: R112012025710-2 Filing date: 2011-04-06 Publication date: 2021-07-20 Inventors: Nikolaos Pagoulatos; Qinglin Ma; Andrew K. Lundberg; Richard Hippe; Clinton T. Siedenburg Applicant: Sonosite, Inc. IPC main class:
DESCRIPTION:
REFERENCE TO RELATED APPLICATIONS The present application claims priority to co-pending US Provisional Patent Application serial number 61/321,666 entitled "Systems and Methods for Enhanced Image Representation of Objects within an Image", filed April 7, 2010, and is related to commonly assigned and co-pending United States patent applications serial number 11/749,319 entitled "System and Methods for Optimized Spatio-Temporal Sampling", filed May 16, 2007, and serial number 11/854,371 entitled "System and Method for Spatial Compounding Using Phased Arrays," filed September 12, 2007, the disclosures of which are hereby incorporated by reference. TECHNICAL FIELD The invention relates generally to ultrasound imaging and more particularly to enhancing the visualization of objects within an image. BACKGROUND OF THE INVENTION Ultrasound imaging systems have gained wide acceptance for use in providing images of objects and areas that are not otherwise visible to an observer. Such ultrasound imaging systems are typically configured with multiple imaging parameters selected to produce the best overall ultrasound image quality rather than the best visualization of individual objects that may be present in a volume being imaged. As a result, the visualization of individual objects is typically compromised in order to achieve satisfactory overall ultrasound image quality. Objects visualized and represented in ultrasound images can comprise biological structures, such as human tissues and organs, and human-made structures, such as implantable devices, instruments, etc. Various biological and human-made structures may require specific imaging parameters to achieve high quality ultrasound visualization that are different from the parameters selected to achieve overall image quality.
Furthermore, the imaging parameters chosen to achieve high quality visualization of one type of structure may be significantly different from the parameters chosen to achieve high quality visualization of a different type of structure. Therefore, it is not a simple task to provide high quality visualization of one or more individual objects within an overall high quality ultrasound image. It is now common practice to use ultrasound imaging systems to aid in the guidance and placement of instruments and other human-made objects. For example, interventional instruments such as needles, catheters, etc., can be used to deliver medication or other fluids directly into a nerve, artery, or vein deep within a patient's body. Such procedures may require accurate positioning of an instrument internal to a patient, thus requiring high quality ultrasound visualization of both biological structures and human-made instruments. It is often difficult, and sometimes impossible, for ultrasound imaging systems configured with imaging parameters selected to optimize the quality of the entire image to provide adequate visualization of instruments inserted at an acute angle with respect to an ultrasound transducer used to generate an ultrasound image. The problem of inadequate visualization of instruments inserted at acute angles results, at least in part, from the fact that the representations of such instruments in ultrasound images are based on ultrasound echoes that are specularly reflected from the instruments. Specular reflection principles indicate that, for acute insertion angles, the ultrasound echoes reflected from the instruments do not reach the ultrasound transducer elements so as to produce a clear representation of the instruments in the resulting ultrasound image.
As a result of the generally inadequate representation in ultrasound images of instruments inserted at acute angles, a clinician must almost always rely on secondary artifacts to visualize or "guess" where the interventional instrument is within a volume (e.g., within the patient's anatomy). For example, a clinician may rely on movement of tissue or other visible structures within the resulting image, caused by pressure from a needle as the needle is inserted or otherwise moved, to visualize where the needle is within the patient's anatomy. Visualizing the location of an interventional instrument based on the movement of adjacent structures generally does not provide accurate location determinations. Another technique used to visualize the location of an interventional instrument requires injecting fluid through the interventional instrument and observing the resulting image as the fluid moves through the medium of the volume being imaged (e.g., as the liquid moves into and through the tissue). This technique, therefore, is also based on secondary artifacts and has not proven to be particularly satisfactory. Several specially shaped echogenic needles have been introduced to address the problem of inadequate visualization of instruments inserted at acute angles. These special needles are typically designed and constructed so that the ultrasound waves reflected from the needle reach the ultrasound transducer elements even when the needle is inserted at acute angles. However, there are several factors that reduce the effectiveness and acceptance of such needles. For example, the high cost associated with these special needles reduces their clinical acceptance and widespread use. BRIEF SUMMARY OF THE INVENTION The present invention is directed to systems and methods that provide multiple ultrasound imaging signals for generating ultrasound images.
For example, various ultrasound imaging signals, where each image signal is related to an object in an imaged volume, can be used in accordance with embodiments of the invention. Using such ultrasound imaging signals, an ultrasound imaging system can provide enhanced or optimized images of various objects in the imaged volume. In accordance with embodiments of the present invention, an ultrasound image signal comprises one or more ultrasound image parameter values, each image parameter being associated with acquisition or processing of ultrasound data. Each image signal of a plurality of image signals can be associated with the same ultrasound modality, such as B-mode, color-flow, power-doppler, elastography, and others. However, each ultrasound image signal of embodiments has one or more image parameters set at values tailored for high quality ultrasound visualization of a particular object of interest. The ultrasound image signals of embodiments of the invention are optimized or otherwise configured to image certain objects, structures, features, etc. of a volume being imaged. For example, a first image signal can be configured to provide relatively high quality images with respect to regions under the surface of living tissue (e.g., the patient's general anatomy), whereas a second image signal can be configured to provide relatively high quality images with respect to interventional instruments (e.g., a needle) inserted into living tissue at an acute angle. Subframes generated using the different ultrasound image signals are preferably combined to form frames that provide relatively high quality images of the various structures, attributes, aspects, etc., collectively referred to as objects (e.g., the patient's general anatomy and the interventional instrument), within the volume being imaged. Frames formed according to embodiments are preferably combined or merged to form a final image.
According to a preferred embodiment, two ultrasound image signals are used, in which one ultrasound image signal is prepared for the purpose of providing high image quality for human tissues and the other is prepared for the purpose of providing high image quality for interventional instruments, such as needles, inserted at acute insertion angles. According to aspects of the embodiments, one of the two preceding ultrasound image signals comprises a predetermined set of steering angles specifically intended for high quality visualization of interventional instruments at acute angles of insertion. According to other aspects of the embodiments, one or more ultrasound image parameter values are changed between the two ultrasound image signals, wherein the one or more ultrasound parameters include waveform transmission parameters, transmit to receive beam line ratio, steering angle, receive line density, number of focal zones, location of focal zones, quadrature bandpass filter type and coefficients, compression curve, speckle reduction, etc. Other embodiments of the invention operate to identify an area or block of interest within a frame for blending or merging with one or more other frames forming a final image. For example, a block in which an interventional instrument is to be disposed can be known or determined. Thus, embodiments of the invention may suppress, or otherwise render unimportant, portions outside the block prior to merging a frame of the interventional instrument with a frame of an anatomical structure when forming the final image. Such embodiments can be used to mitigate or prevent image blurring, artifacts, etc. associated with the use of an interventional instrument image signal. The foregoing has broadly outlined the technical features and advantages of the present invention so that the detailed description of the invention that follows may be better understood.
Additional features and advantages of the invention will be described below and form the subject of the claims of the invention. It should be noted by those skilled in the art that the concept and specific embodiments described can be readily used as a basis for modifying or designing other structures for achieving the same purposes as the present invention. It should also be understood by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the invention as set forth in the appended claims. The novel features believed to be characteristic of the invention, both as to its organization and method of operation, together with other objects and advantages, will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS For a more complete understanding of the present invention, reference is now made to the following descriptions considered in conjunction with the accompanying drawings, wherein: FIGURES 1A and 1B show an ultrasound imaging system adapted in accordance with an embodiment of the invention; FIGURES 2A-2C show different subframes used in accordance with an embodiment of the invention; FIGURE 2D shows a frame generated from the subframes of FIGURES 2A-2C in accordance with an embodiment of the invention; FIGURES 3A and 3B show different subframes according to an embodiment of the invention; FIGURE 3C shows a frame generated using the subframes of the ultrasound image signal of FIGURES 3A and 3B in accordance with an embodiment of the invention; FIGURE 4 shows a final ultrasound image generated using the frames of FIGURES 2D and 3C in accordance with an embodiment of the invention; FIGURE 5 shows a schematic diagram of operation of the ultrasound imaging system of FIGURES 1A and 1B operating to generate a final ultrasound image using a multiple ultrasound imaging signal technique of an embodiment of the invention; FIGURE 6 shows a high level operating flow diagram of the ultrasound imaging system of FIGURES 1A and 1B operating to generate a final ultrasound image employing a multiple ultrasound imaging signal technique of an embodiment of the invention; FIGURE 7 shows a coordinate system for detecting straight lines in subframes in accordance with embodiments of the invention; and FIGURE 8 shows graphics in a final image indicating the coverage area defined by the highly steered subframe of embodiments of the invention. DETAILED DESCRIPTION OF THE INVENTION FIGURE 1A shows an ultrasound imaging system adapted in accordance with an embodiment of the invention. Specifically, shown is ultrasound imaging system 100 comprising system unit 110 coupled to transducer 120.
System unit 110 of the embodiments comprises a processor-based system operable to control transducer 120 to transmit and receive ultrasound signals, to process the received ultrasound signals, to generate an image employing the processed received ultrasound signals, and to display the generated image (e.g., on display 111). Transducer 120 comprises an array of ultrasound elements operable to controllably transmit and receive ultrasound signals. Detail regarding imaging systems that can be adapted in accordance with the concepts of the present invention is provided in the co-pending and commonly assigned US patent application serial number 12/467,899 entitled "Modular Apparatus for Diagnostic Ultrasound", the disclosure of which is hereby incorporated by reference. In operation, ultrasound imaging system 100 performs an imaging technique known as "spatial compounding" in which multiple different steering angles are used to insonify a volume being imaged. Further detail regarding spatial compounding techniques is found in the aforementioned patent applications entitled "System and Method for Optimized Spatio-Temporal Sampling" and "System and Method for Spatial Compounding Using Phased Arrays". Using typical spatial compounding techniques, data collected with respect to a volume being imaged using a single steering angle is processed to form a subframe, and all subframes of the imaged volume are then compounded to produce a frame. A frame can be formed, for example, using two subframes, three subframes, four subframes or more, corresponding to employing two, three, four or more steering angles, respectively. FIGURES 2A-2D illustrate the use of three subframes in generating a frame. Frame 200 of FIGURE 2D, for example, illustrates a view of the volume being imaged, such as may comprise living tissue of a patient's anatomy.
In the illustrated example, frame 200 of FIGURE 2D may be generated by ultrasound imaging system 100 using subframes 201-203 of FIGURES 2A-2C. Subframe 201 of the illustrated embodiment provides flat, non-steered insonification (e.g., a 0° steering angle) with respect to transducer 120 and provides subframe data for the non-steered subframe. Subframes 202 and 203 of the illustrated embodiment provide offset steering angles. For example, the steering angle for subframe 202 can be steered to the left by approximately -14° and the steering angle for subframe 203 may be steered to the right by approximately +14°. The data provided by these subframes are combined to generate frame 200 with improved image characteristics over a frame generated using a single steering angle (e.g., a frame based only on the data available for subframe 201). The resulting frame 200 provides higher desired image quality by, for example, decreasing shadowing effects, reducing speckle noise, and improving boundary demarcation. Ultrasound imaging systems that employ spatial compounding are typically configured with multiple image parameters selected to produce the best overall ultrasound image quality, rather than the best visualization of individual objects within a volume being imaged. In general, the term "image parameters" refers to parameters associated with the acquisition or processing, or both, of ultrasound data. Examples of such image parameters comprise waveform transmission parameters, transmit to receive beam line ratio, image steering angle, receive line density, number of focal zones, location of focal zones, quadrature bandpass filter type and coefficients, compression curve and speckle reduction, among others. As image parameters are typically selected to produce the best overall image quality, the visualization quality of individual objects within the volume being imaged may be compromised.
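The compounding of co-registered subframes into a single frame, as described above, can be sketched as a weighted average of the subframe data. This is a simplified illustration only (real systems scan-convert each steered subframe onto a common pixel grid before combining); the function name and equal-weight default are illustrative, not taken from the patent.

```python
import numpy as np

def compound_subframes(subframes, weights=None):
    """Compound co-registered subframes (2-D arrays of equal shape)
    into a single frame by weighted averaging.

    Simplified sketch: assumes subframes are already scan-converted
    onto a common grid. Equal weighting is used unless weights are
    supplied."""
    stack = np.stack(subframes, axis=0).astype(np.float64)
    if weights is None:
        weights = np.ones(len(subframes)) / len(subframes)
    weights = np.asarray(weights, dtype=np.float64)
    # Weighted sum over the subframe axis yields the compounded frame.
    return np.tensordot(weights, stack, axes=1)

# Three subframes (0 deg, -14 deg, +14 deg steering) combine into one frame.
sf_center = np.full((4, 4), 10.0)
sf_left = np.full((4, 4), 20.0)
sf_right = np.full((4, 4), 30.0)
frame = compound_subframes([sf_center, sf_left, sf_right])
```

With equal weights, each output pixel is simply the mean of the corresponding subframe pixels, which is the averaging behavior that reduces speckle noise and shadowing in the compounded frame.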
However, from a clinical point of view it is beneficial to present a clinician with a single ultrasound image in which there is high quality visualization of all objects of interest, including different biological structures and human-made objects. An exemplary case illustrating this problem is the visualization of instruments, such as interventional instrument 130, which may comprise needles, catheters, etc., when inserted at relatively acute angles with respect to transducer 120 (as shown in FIGURE 1B). For example, a portion of interventional instrument 130, such as may comprise a needle, catheter, stent, percutaneous instrument, etc., is inserted into the volume being imaged 101 to perform an interventional procedure. Such procedures may involve injecting pharmaceutical substances into target 102, performing a biopsy related to target 102, etc. In general, when the angle of insertion (represented by angle α in FIGURE 1B) is approximately 20° or greater, it is considered an acute angle of insertion. When interventional instruments are inserted at acute angles, the image steering angles of the subframes configured to image target 102 may not provide a satisfactory image of interventional instrument 130. For example, specular reflection effects may cause the signal reflected from the interventional instrument to not be readily visible in the resulting image. The failure to provide a satisfactory image with respect to interventional instrument 130 is particularly problematic where ultrasound system 100 is being used to generate images that facilitate a clinician's performance of an interventional procedure using interventional instrument 130.
While it may appear that the preceding spatial compounding technique can be modified to include one or more subframes configured to image interventional instrument 130, the various different steering angles of the subframes that can be compounded using spatial compounding techniques are constrained if image quality is to be maintained or improved. For example, it has been found that the use of subframes providing sharper steering angles (e.g., greater than ±15°, as may be desirable for use in imaging an interventional instrument inserted at an acute angle) in combination with subframes of the preceding image signal providing less acute steering angles results in image degradation. That is, particular subframes are not compatible for the generation of a frame using spatial compounding techniques because their compounding results in undesired or unacceptable image blurring, artifacts, etc. "Compatible", as used with reference to image parameters, means that several image parameters (e.g., optimized or otherwise configured for use with respect to a particular object) can be employed together (e.g., in a spatial compounding technique) without resulting in undesired or unacceptable image blurring, artifacts, etc. Conversely, "incompatible" means that image parameters (e.g., optimized or otherwise configured for use with respect to a particular object) may result in undesired or unacceptable image blurring, artifacts, etc. when employed together with other image parameters (e.g., optimized or otherwise configured for use with other objects) and thus may be considered not compatible with these other image parameters.
To achieve high quality ultrasound visualization of particular objects within an imaged volume, embodiments of the present invention employ ultrasound image signals that comprise one or more ultrasound image parameter values intended to produce high quality ultrasonic visualization of particular objects of interest. These ultrasound image signals comprise image parameters associated with the acquisition and/or processing of ultrasonic data, with the parameter values intended to provide high quality ultrasonic visualization of a particular object of interest. In a preferred embodiment, a particular ultrasound image signal is tailored to an object of interest without compromising the visualization of other objects being examined. In accordance with embodiments of the present invention, the particular subframes used in generating a frame are selected so that they are optimized or otherwise configured to image particular objects in accordance with the ultrasound image signal defined for a given object. For example, such a collection of subframes can be configured to provide relatively high quality images with respect to regions under the surface of living tissue (e.g., the patient's general anatomy), particular tissues (e.g., cardiac, hepatic, uterine, gastric, etc.), particular structures (e.g., bone joints, arterial bifurcations, nerve bundles, etc.), or human-made objects such as implants and instruments, and/or the like. In accordance with embodiments of the present invention, the collection and processing of subframes, such as subframes 201-203, and their combination to produce a frame, such as frame 200, are associated with an ultrasound image signal related to target 102.
Therefore, ultrasound imaging systems of embodiments of the present invention are operable to provide a plurality of different ultrasound image signals wherein each ultrasound image signal of the plurality provides high quality visualization of a corresponding object of interest within an examined volume. In accordance with an embodiment of the invention, a first ultrasound image signal (e.g., comprising subframes 201-203 of FIGURES 2A-2C) is configured to provide relatively high quality images with respect to regions under the surface of the volume being imaged 101. Correspondingly, a second ultrasound image signal (e.g., comprising subframes 301-302 of FIGURES 3A-3B) is configured to provide relatively high quality images with respect to a portion of interventional instrument 130 inserted at an acute angle into the volume being imaged 101. Subframes 301 and 302 of FIGURES 3A and 3B are configured to provide relatively high quality images with respect to interventional instrument 130, inserted at an acute angle, by introducing sharper steering angles to provide image data that reduce the adverse specular reflection effects associated with interventional instrument 130. As discussed, interventional instrument 130 of the illustrated embodiment is at a relatively acute angle to the face of transducer 120 and is visible in frame 300 of FIGURE 3C because the steering angles of at least one of subframes 301 and 302 are highly steered in one or more appropriate orientations. It should be noted that, for clarity, features of the imaged volume 101 (e.g., tissue and other structure) are not shown in frame 300. Such features of the imaged volume 101 would be substantially degraded due to blurring or other factors associated with the use of subframes 301 and 302.
In configuring subframes 301 and 302 in accordance with embodiments of the invention, the steering angles can be selected to substantially match the insertion angle of interventional instrument 130 so as to provide more nearly normal signal transmission to the surface of interventional instrument 130. By way of example, interventional instrument 130 can be inserted at an angle of approximately 60° and, correspondingly, the steering angles included with respect to subframes 301 and 302 can be ±30° so as to provide transmission of acoustic waves resulting in an angle of incidence of approximately 90° with the face of interventional instrument 130. It should be noted that angles of incidence other than 90° may be used according to embodiments of the invention. Preferred embodiments operate to provide angles of incidence in the range of 75° to 105° with respect to a surface of an object to provide a high quality image of the object. It should be noted that the insertion angle of an interventional instrument being imaged can be known or assumed, for setting and/or selecting parameter values for an image signal suitable for use therewith, by various means. For example, an interventional instrument guide can be used to provide a particular insertion angle. The insertion angle provided by a selected interventional instrument guide can be automatically provided to a processor of system unit 110 through various sensors or other feedback techniques, such as those shown in US patent application serial number 11/216,735, entitled "Medical Device Guide Locator", filed August 31, 2005, assigned to Sonosite, Inc., the assignee of the present invention, the disclosure of which is hereby incorporated by reference.
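The steering geometry described above (a 60° insertion paired with a 30° steer to yield roughly 90° incidence) can be sketched with simple angle arithmetic. The functions below are hypothetical helpers, not part of the patent; they assume both angles are measured in the same plane relative to the transducer face, so that insertion angle, steering angle, and incidence angle sum to 180°.

```python
def steering_for_normal_incidence(insertion_angle_deg):
    """Return the steering angle that makes the transmitted beam
    strike the instrument surface at approximately 90 degrees,
    under the assumed planar geometry:
    insertion + steering + incidence = 180."""
    return 90.0 - insertion_angle_deg

def incidence_angle(insertion_angle_deg, steering_angle_deg):
    """Angle between the steered beam and the instrument surface
    under the same assumed geometry."""
    return 180.0 - insertion_angle_deg - steering_angle_deg

# Document's example: a 60 deg insertion pairs with a 30 deg steer.
steer = steering_for_normal_incidence(60.0)
hit = incidence_angle(60.0, steer)
```

A steering angle chosen this way can then be checked against the 75° to 105° incidence range that preferred embodiments target.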
When an interventional instrument is inserted freehand (e.g., without the aid of a guide), an insertion angle used for setting and/or selecting an appropriate image signal can be estimated as the intended insertion angle, a typical insertion angle, a calculated insertion angle, etc. Additionally or alternatively, an insertion angle can be determined by analyzing collected data (e.g., image data), such as by identifying attributes of an interventional instrument within a generated image. The ultrasound image signals used in generating the final image can have different numbers and types of image parameters. For example, the foregoing exemplary image signals comprise three and two subframes, respectively. There is no limitation with respect to the illustrated number of subframes and other image parameters, and therefore the image signals of embodiments may include fewer or more subframes than illustrated. For example, an image signal configured for use with respect to a particular object or feature, such as an ultrasound image signal configured for use with respect to interventional instrument 130, may include a single subframe in accordance with embodiments of the invention. While ultrasound image signals of embodiments comprise subframes that include various steering angles, embodiments of the invention may utilize ultrasound image signals that comprise a number of additional or alternative image parameters. For example, ultrasound image signals of embodiments can provide imaging parameter settings that include waveform transmission parameters, transmit to receive beam line ratio, steering angle, receive line density, number of focal zones, location of focal zones, quadrature bandpass filter type and coefficients, compression curve, speckle reduction parameters, etc., to highlight the objects of interest.
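An "ultrasound image signal" in the sense used here is essentially a named bundle of parameter values. A minimal sketch of such a bundle follows; the class and field names are illustrative assumptions (only a few of the many parameters listed above are shown), not the patent's actual data structures.

```python
from dataclasses import dataclass

@dataclass
class ImageSignal:
    """Hypothetical container for one ultrasound image signal: a
    named set of acquisition/processing parameter values. Field
    names are illustrative, not taken from the patent."""
    name: str
    steering_angles_deg: tuple  # one entry per subframe
    focal_zones: int = 1
    speckle_reduction: float = 0.5

# Signal prepared for general tissue: three mildly steered subframes.
tissue_signal = ImageSignal("tissue", (-14.0, 0.0, 14.0), focal_zones=2)

# Signal prepared for a steeply inserted instrument: highly steered subframes.
instrument_signal = ImageSignal("instrument", (-30.0, 30.0), speckle_reduction=0.0)
```

Keeping each signal as a self-contained parameter set makes it straightforward to acquire and process the two signals independently and tailor each to its object of interest.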
Regardless of the particular ultrasound image signals used, system unit 110 of embodiments operates to generate image frames (e.g., frame 200 of FIGURE 2D and frame 300 of FIGURE 3C) from the processed received signals with respect to each of the ultrasound image signals employed. Such image frames formed according to embodiments are preferably combined or merged to produce a final image. FIGURES 5 and 6 illustrate operation of ultrasound system 100 in generating final image 400 using a plurality of image signals and the image frames generated thereby, as described herein. As shown in flow diagram 600 of the embodiment illustrated in FIGURE 6, a first ultrasound image signal for imaging a first object (e.g., object 102) is established in block 601. For example, ultrasound image signal 502 of FIGURE 5 (comprising subframes 201-203) can be established for use in imaging a first object. In addition, a second ultrasound image signal for imaging a second object (e.g., interventional instrument 130) is established in block 602 of the illustrated embodiment. For example, ultrasound image signal 503 of FIGURE 5 (comprising subframes 301-302) may be established for use in imaging a second object. In block 603 of the illustrated embodiment, image data is collected and processed using the first ultrasound image signal. For example, system unit 110 and transducer 120 can cooperate to implement the first ultrasound image signal and collect and process image data. Furthermore, in block 604 of the illustrated embodiment, image data is collected and processed using the second ultrasound image signal. For example, system unit 110 and transducer 120 can cooperate to implement the second ultrasound image signal and collect and process image data. As shown in the embodiment of FIGURE 5, data collected by transducer 120 operating in accordance with ultrasound image signal 502 is provided to system unit 110.
The subframes collected and processed by means of the first image signal are further processed by system unit 110 to provide a frame that includes the first object in block 603 of the illustrated embodiment. For example, spatial compounding processing 511 of embodiments, as may comprise a processor operating under control of an instruction set defining operation as described herein, is operable with respect to the data collected by means of ultrasound image signal 502 to generate frame 200 comprising a high quality image of target 102. Correspondingly, data collected by transducer 120 operating in accordance with ultrasound image signal 503 is provided to system unit 110. The subframes collected and processed by means of the second ultrasound image signal are further processed by system unit 110 to provide a frame including the second object in block 604 of the illustrated embodiment. For example, interventional instrument detection 512, as may be provided by algorithms operable on a processor of system unit 110, is operable with respect to data collected by means of ultrasound image signal 503 to generate frame 300 comprising a high quality image of a portion of interventional instrument 130. Interventional instrument detection 512 of embodiments, as may comprise a processor operating under control of an instruction set defining operation as described herein, provides operation for detecting interventional instrument 130 or other objects of interest. For example, interventional instrument detection algorithms 512 may analyze data collected by transducer 120 to identify attributes of interventional instrument 130, such as for configuration and/or selection of a particular ultrasound image signal for use in providing a high quality image thereof.
Additionally or alternatively, interventional instrument detection 512 of embodiments operates to identify interventional instrument 130 within one or more subframes, such as subframes 301 and 302, in order to provide isolation of the interventional instrument for generating frame 300, in which the interventional instrument and its surrounding area are shown. For example, the shape or other features of interventional instrument 130 may be known (e.g., an interventional instrument in the form of a needle has a known shape, namely a linear segment) and may be readily identified using an appropriate algorithm. In one embodiment, the Hough Transform mathematical model is used to detect the interventional instrument. This is a well-known model for detecting lines and any curves that can be expressed in parametric form. Using this method, the object of interest can be modeled as a straight line or a parametric curve and the algorithm determines where the object is within the subframe, such as subframes 301 and 302. Regardless of the particular technique used, interventional instrument detection 512 of embodiments provides segmentation of interventional instrument 130 and its immediate surrounding area, the result of which is mask 530. The segmentation result, such as mask 530, enables the isolated use of interventional instrument 130 and its surrounding area in generating final image 400 without degrading the tissue image (e.g., as provided by frame 200). As mentioned previously, the Hough Transform for detecting straight lines is used with subframes 301 and 302 according to embodiments.
Based on a use of the Hough Transform, and drawing attention to FIGURE 7, the line 701 in the {x,y} coordinate system 700 can be expressed by the equation: p = x · cosθ + y · sinθ (1) where p is the distance of the line from the origin of the {x,y} coordinate system 700 and θ is the angle between the x axis of the {x,y} coordinate system and the line perpendicular to the line of interest 701. During a Hough Transform initialization process according to embodiments, a 2D array containing the subframe pixels in the {x,y} coordinate system and a second 2D array called the accumulator array are defined. Each cell in the accumulator array corresponds to a particular set of parameters (p0,θ0) that represents a single line in the processed subframe, as shown by equation (1). The size of the 2D accumulator array depends on the range of distances p and angles θ that are of interest and the resolution with which they are defined. Once initialization is complete, the main mechanism of the Hough Transform is to iterate through all the pixels of a subframe and, for each pixel (x0,y0) that satisfies a set of criteria, such as an intensity threshold, gradient strength, etc., increment a counter in the accumulator array for all cells (p, θ) that satisfy equation (1) for that pixel (x0,y0). In operation according to embodiments, after the entire subframe has been traversed, the cell in the accumulator array with the highest counter value corresponds to the 2D line in image subframe 301 or 302 representing the intervention instrument. To further identify the particular segment along the 2D line where the intervention instrument is located, the intensity and the magnitude of the gradient parallel to the direction perpendicular to the detected line are examined for each pixel over the entire length of the 2D line identified in the subframe according to embodiments. 
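The voting mechanism described above can be sketched as follows. This is an illustrative sketch only; the array sizes, intensity threshold, and function name are assumed choices, not values from the patent.

```python
import numpy as np

def hough_peak(subframe, intensity_thresh, n_p=200, n_theta=180):
    """Vote for lines p = x*cos(theta) + y*sin(theta) (equation (1))
    and return the (p, theta) pair whose accumulator cell has the
    highest count."""
    h, w = subframe.shape
    max_p = float(np.hypot(h, w))                    # largest |p| possible in the subframe
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    acc = np.zeros((n_p, n_theta), dtype=np.int64)   # the accumulator array
    ys, xs = np.nonzero(subframe > intensity_thresh) # pixels meeting the criteria
    for x, y in zip(xs, ys):
        ps = x * cos_t + y * sin_t                   # equation (1) for every theta
        bins = ((ps + max_p) / (2.0 * max_p) * (n_p - 1)).astype(int)
        acc[bins, np.arange(n_theta)] += 1           # one vote per (p, theta) cell
    p_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    p = p_idx / (n_p - 1) * 2.0 * max_p - max_p      # map bin index back to a distance
    return p, thetas[t_idx]
```

For a subframe containing a bright horizontal segment at row 20, all of its pixels vote for the same cell near (p = 20, θ = π/2), so that cell dominates the accumulator.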
The pixels along the detected 2D line for which both the aforementioned intensity and gradient magnitude exceed a certain threshold define the particular location of the intervention instrument along the detected 2D line in the subframe, and can be used to define the mask 530 in frame 300 shown in FIGURE 5 (e.g., a margin, such as a margin of a predetermined number of pixels, a predetermined distance, a percentage of the object's size, etc., can be used around the object to define the mask). According to an embodiment, intervention instrument detection 512 may include a pre-processing step to remove artifacts and noise from the subframe before the Hough Transform or other object identification technique is applied. It should be noted that, although shown as distinct blocks in the illustrated embodiment, spatial compounding processing 511 and intervention instrument detection 512 can be provided in combined circuitry according to embodiments. For example, the same processor may operate under control of an instruction set defining the spatial compounding processing operation 511 and an instruction set defining the intervention instrument detection operation 512, if desired. Having generated frames 200 and 300, each providing a high quality image with respect to different objects in the area to be represented by image 101, the system unit 110 of the illustrated embodiment uses frame merging 513 to merge frames 200 and 300 to form the final image 400, providing a high quality image of various objects within the volume represented by image 101 in block 605. For example, mask 530 of frame 300, which defines the precise location of the intervention instrument 130, can be merged into frame 200 to form the final image 400. 
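The masking step can be sketched as below, assuming a fixed pixel margin and example thresholds; the function name, margin, and threshold values are illustrative assumptions, not the patent's.

```python
import numpy as np

def instrument_mask(subframe, p, theta, margin=3,
                    intensity_thresh=100.0, grad_thresh=10.0):
    """Keep pixels on the detected line whose intensity and gradient
    magnitude both exceed their thresholds, then grow the result by
    `margin` pixels (compare mask 530)."""
    h, w = subframe.shape
    gy, gx = np.gradient(subframe.astype(float))
    grad_mag = np.hypot(gx, gy)
    ys, xs = np.mgrid[0:h, 0:w]
    # distance of every pixel from the line p = x*cos(theta) + y*sin(theta)
    dist = np.abs(xs * np.cos(theta) + ys * np.sin(theta) - p)
    on_line = (dist <= 0.5) & (subframe > intensity_thresh) & (grad_mag > grad_thresh)
    # simple dilation by `margin` pixels (wrap-around at the borders is
    # ignored here for brevity)
    mask = np.zeros_like(on_line)
    for dy in range(-margin, margin + 1):
        for dx in range(-margin, margin + 1):
            mask |= np.roll(np.roll(on_line, dy, axis=0), dx, axis=1)
    return mask
```

The margin around the qualifying pixels is what lets the surrounding area of the instrument be carried into the final image along with the instrument itself.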
In the illustrated embodiment, frames 200 and 300 are aligned based on the acquisition and processing used to generate them, while mask 530 serves to identify the portion of frame 300 that will be merged with frame 200. For pixels located outside the region identified by mask 530, the pixel values in the final image 400 are identical to those of frame 200 according to embodiments. For each pixel location within mask 530, the resulting pixel value for final image 400 can be a combination of the corresponding pixel values found in frames 200 and 300. A merge function can be established by frame merging 513 to provide the foregoing merging of frames to generate a final image. Assuming that a pixel (x0,y0) is within the region identified by mask 530 and that f1(x0,y0), f2(x0,y0) and f(x0,y0) are the pixel values of frames 200 and 300 and of the final image 400, respectively, the merge function can be written as: f(x0,y0) = (1-b) · f1(x0,y0) + b · f2(x0,y0) (2) where b is a coefficient between 0 and 1 that can be constant, such as 0.5, to produce an average of the two pixel values. In another embodiment, coefficient b may be a function of location within the mask, where coefficient b has higher values for locations on mask 530 closer to intervention instrument 130 and lower values for locations on mask 530 further from intervention instrument 130. Alternatively, instead of the linear operation shown by equation (2), a non-linear operation can be used in a merge function of embodiments. In one example of such a case, f(x0,y0) can be the maximum of the values f1(x0,y0) and f2(x0,y0). Although only two ultrasound image signals are shown with respect to the illustrated embodiment, it should be noted that any number of ultrasound image signals can be used as appropriate for particular embodiments. 
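Equation (2) and the non-linear maximum alternative can be sketched as per-pixel operations; `b` may be a scalar or a per-pixel weight map (e.g., larger near the instrument). The function names are illustrative, not from the patent.

```python
import numpy as np

def blend_frames(f1, f2, mask, b=0.5):
    """f = (1-b)*f1 + b*f2 inside the mask (equation (2));
    outside the mask the tissue frame f1 is used unchanged."""
    f1 = np.asarray(f1, dtype=float)
    f2 = np.asarray(f2, dtype=float)
    b = np.broadcast_to(np.asarray(b, dtype=float), f1.shape)
    out = f1.copy()
    out[mask] = (1.0 - b[mask]) * f1[mask] + b[mask] * f2[mask]
    return out

def blend_max(f1, f2, mask):
    """Non-linear alternative: keep the brighter of the two pixels in the mask."""
    f1 = np.asarray(f1, dtype=float)
    f2 = np.asarray(f2, dtype=float)
    out = f1.copy()
    out[mask] = np.maximum(f1[mask], f2[mask])
    return out
```

With a constant b = 0.5 this averages the two frames inside the mask, while copying the tissue frame verbatim everywhere else.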
For example, where 3 objects of interest (e.g., a nerve, an intervention instrument, and a main or underlying volume) are to be imaged, embodiments can provide 3 ultrasound image signals (e.g., configured for each of the nerve, the intervention instrument and the corresponding main volume) to generate fused frames that form the final image. By generating and merging frames according to embodiments of the invention, a final image is generated that provides high quality images of objects, substantially free from unwanted or unacceptable blurring, artifacts, etc., even when using imaging parameters that are otherwise not compatible. For example, employing a multiple ultrasound image signal technique of embodiments of the invention, the tissue image is not degraded even though image processing is performed to render an interventional instrument inserted at an acute angle clearly visible in the resulting image. Operation of the system unit 110 in accordance with embodiments can provide image processing in addition to the foregoing spatial compounding, intervention instrument detection and frame merging. For example, embodiments of the invention can provide speckle reduction and/or other types of image processing algorithms with respect to the frames and/or the final image. Additionally or alternatively, with attention directed to FIGURE 8, operation of system unit 110 of embodiments includes graphic 800 (shown here as a trapezoid indicated by dashed lines) in final image 400, such as to indicate coverage area 801. Graphic 800 can be controlled, for example, by parameters used to generate frame 300, to indicate the coverage area 801 defined by the highly steered subframe 301 in FIGURE 5. Coverage area 801, as indicated by graphic 800, comprises an area where high quality visualization of the intervention instrument is made possible. 
This graphic allows the clinician or other operator to know that objects of interest will appear with high quality visualization only within certain portions of the final image, such as coverage area 801, and that if they instead appear in region 802 of the final image 400, care must be taken because the object (such as the intervention instrument) is outside the high quality field of view (although the intervention instrument may nevertheless be within the volume represented by the image). If desired, certain attributes (such as an endpoint, midpoint, etc.) of the intervention instrument or other targets can be coded so that different portions appear differently in the final image (e.g., different intensities, different colors, etc.), in order to alert the clinician or other operator to which portion of an object (e.g., interventional instrument) is being displayed. Such image processing can be provided by frame merging 513 of embodiments, for example. It should be noted that application of a multiple signal ultrasound imaging technique of embodiments of the invention is not limited to use with interventional instruments. For example, one ultrasound image signal can be configured to image a nerve while another ultrasound image signal is configured to image another object, such as a tumor, whereby these ultrasound image signals are used in cooperation to form frames that are fused together to provide a final image displaying high image quality of each object. While the present invention and its advantages have been described in detail, it is to be understood that various changes, substitutions and modifications may be made without departing from the spirit and scope of the invention as defined by the appended claims. 
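Whether a detected point (e.g., the instrument tip) falls inside the trapezoidal coverage area 801 or in region 802 can be decided with a standard ray-casting point-in-polygon test. The trapezoid vertices below are illustrative placeholders, not coordinates from the patent.

```python
def point_in_polygon(x, y, poly):
    """Ray-casting test; `poly` is a list of (x, y) vertices in order."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # does this edge cross the horizontal ray extending right of (x, y)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# a trapezoid like graphic 800, in image coordinates (illustrative vertices)
coverage_801 = [(10, 0), (40, 0), (55, 50), (-5, 50)]
```

A point outside the polygon corresponds to region 802, where the operator would be alerted that the instrument lies outside the high quality field of view.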
Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the disclosure of the present invention, processes, machines, manufacture, compositions of matter, means, methods or steps presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present invention. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods or steps.
Claims (10) [0001] 1. Method for ultrasound imaging, characterized by comprising: establishing a first ultrasound image signal (502) with one or more sets of imaging parameters, including a transmit waveform, line rate for receiving beamformed lines, image steering angle, receive line density, number of focal zones, quadrature bandpass filter type and coefficients, compression curve, and speckle reduction parameters, selected to produce a high quality image of tissue within a volume to be imaged; establishing a second ultrasound image signal (503) with one or more sets of imaging parameters, including a transmit waveform, line rate for receiving beamformed lines, image steering angle, receive line density, number of focal zones, quadrature bandpass filter type and coefficients, compression curve, and speckle reduction parameters, selected to produce a high quality image of an intervention instrument (130) within the volume to be imaged; generating a first frame (200) using image data resulting from applying the first ultrasound image signal; generating a second frame (300) using image data resulting from applying the second ultrasound image signal; detecting pixels in the second frame that correspond to the intervention instrument; automatically determining a mask (530) of pixels corresponding to the location of the intervention instrument; blending pixels from the first frame and the second frame in a region that is inside the mask and using pixels from the first frame in a region that is outside the mask to form a final blended ultrasound image (400); and producing a graphical indication (800) in the final blended ultrasound image that indicates the boundaries of the coverage area in the final blended ultrasound image where one or more of the image parameters of the second ultrasound image signal is selected to produce a high quality image of the intervention instrument. [0002] 2. 
Method according to claim 1, characterized in that the intervention instrument (130) is selected from the group consisting of a needle, a catheter, a stent and a percutaneous instrument. [0003] 3. Method according to claim 1, characterized in that the image steering angle of the first ultrasound image signal (502) is selected to provide a tissue image of desired quality and the image steering angle of the second ultrasound image signal (503) is selected to provide an image of the intervention instrument of desired quality. [0004] 4. Method according to claim 3, characterized in that the image steering angle of the second ultrasound image signal is more acute than the image steering angle of the first ultrasound image signal. [0005] 5. Method according to claim 4, characterized in that the image steering angle of the first ultrasound image signal comprises a steering angle magnitude of not greater than 20 degrees and the image steering angle of the second ultrasound image signal comprises a steering angle magnitude greater than 20 degrees. [0006] 6. 
Ultrasound imaging system, characterized in that it comprises: an ultrasound imaging system configured to produce: a first frame (200) from one or more subframes acquired with a first image signal (502) with one or more image parameters, including a transmit waveform, line rate for receiving beamformed lines, image steering angle, receive line density, number of focal zones, quadrature bandpass filter type and coefficients, compression curve, and speckle reduction parameters, selected to produce a high quality image of tissue within a volume to be imaged; and a second frame (300) from one or more subframes acquired with a second image signal (503) with one or more image parameters, including a transmit waveform, line rate for receiving beamformed lines, image steering angle, receive line density, number of focal zones, quadrature bandpass filter type and coefficients, compression curve, and speckle reduction parameters, selected to produce a high quality image of an intervention instrument (130) within the volume to be imaged; and a processor-based system operable to: detect pixels in the second frame that correspond to the intervention instrument; determine a mask (530) of pixel coordinates corresponding to the location of the intervention instrument, wherein the mask is defined by a region in the second frame including the detected pixels corresponding to the intervention instrument and a number of pixels immediately surrounding the detected pixels; combine a first set of pixels from the first frame and a second set of pixels from the second frame in an area that matches the mask and use pixels from the first frame in an area that is outside the mask to form a final blended ultrasound image (400) comprising the tissue and the intervention instrument; and produce a graphic (800) in the final blended ultrasound image that indicates the boundaries of the coverage area in the final blended ultrasound image where one or more of the image parameters of the second ultrasound image signal is selected to produce an image of the intervention instrument. [0007] 7. System according to claim 6, characterized in that the intervention instrument (130) is selected from the group consisting of a needle, a catheter, a stent and a percutaneous instrument. [0008] 8. System according to claim 6, characterized in that the image steering angle of at least one of the first and second image signals (502, 503) is determined during system operation. [0009] 9. System according to claim 6, characterized in that the image steering angle of the second image signal (503) is more acute than the image steering angle of the first image signal (502). [0010] 10. System according to claim 9, characterized in that the image steering angle of the first image signal (502) comprises a steering angle magnitude of not greater than 20 degrees and the image steering angle of the second image signal (503) comprises a steering angle magnitude greater than 20 degrees.
Patent family:
Publication number | Publication date US9895133B2|2018-02-20| EP2555683A1|2013-02-13| WO2011127191A1|2011-10-13| EP2555683B1|2019-12-11| CA2796067A1|2011-10-13| JP6462164B2|2019-01-30| BR112012025710A2|2019-11-12| JP2013523343A|2013-06-17| CA2796067C|2018-09-11| JP6322321B2|2018-05-09| JP2016052608A|2016-04-14| US20150094565A1|2015-04-02| JP2018065010A|2018-04-26| CN103237499A|2013-08-07| US20110249878A1|2011-10-13| JP5972258B2|2016-08-17| JP2017154015A|2017-09-07| EP2555683A4|2015-08-26| JP2018065011A|2018-04-26| CN103237499B|2015-03-25| JP6258367B2|2018-01-10| US8861822B2|2014-10-14|
Legal status:
2019-12-03| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]| 2020-01-21| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]| 2021-05-25| B09A| Decision: intention to grant [chapter 9.1 patent gazette]| 2021-07-20| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 06/04/2011, SUBJECT TO THE LEGAL CONDITIONS. PATENT GRANTED PURSUANT TO ADI 5,529/DF, WHICH DETERMINES THE MODIFICATION OF THE TERM OF GRANT.
Priority:
Application number | Filing date | Patent title US32166610P| true| 2010-04-07|2010-04-07| US61/321,666|2010-04-07| US12/790,109|2010-05-28| US12/790,109|US8861822B2|2010-04-07|2010-05-28|Systems and methods for enhanced imaging of objects within an image| PCT/US2011/031447|WO2011127191A1|2011-04-06|Systems and methods for enhanced imaging of objects within an image|